
    Novel methods of fabrication and metrology of superconducting nanostructures

    As metrology extends toward the nanoscale, a number of potential applications and new challenges arise. By combining photolithography with focused ion beam and/or electron beam methods, superconducting quantum interference devices (SQUIDs) with loop dimensions down to 200 nm and superconducting bridge dimensions of the order of 80 nm have been produced. These SQUIDs have a range of potential applications. As an illustration, we describe a method for characterizing the effective area and the magnetic penetration depth of a structured superconducting thin film in the extreme limit, where the superconducting penetration depth λ is much greater than the film thickness and is comparable with the lateral dimensions of the device.
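For orientation, in the thin-film limit the abstract describes (penetration depth λ much larger than the film thickness d), screening is governed not by λ itself but by an effective two-dimensional screening length. The standard relation, stated here from the general literature rather than quoted from the paper, is the Pearl length:

```latex
\Lambda = \frac{2\lambda^{2}}{d}, \qquad \lambda \gg d .
```

In this regime Λ can become comparable to, or exceed, the lateral dimensions of a sub-micron device, which is why the effective area must be measured rather than inferred from geometry alone.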

    Strategically managing learning during perceptual decision making

    Making optimal decisions in the face of noise requires balancing short-term speed and accuracy. But a theory of optimality should account for the fact that short-term speed can influence long-term accuracy through learning. Here, we demonstrate that long-term learning is an important dynamical dimension of the speed-accuracy trade-off. We study learning trajectories in rats and formally characterize these dynamics in a theory expressed as both a recurrent neural network and an analytical extension of the drift-diffusion model that learns over time. The model reveals that choosing suboptimal response times to learn faster sacrifices immediate reward, but can lead to greater total reward. We empirically verify predictions of the theory, including a relationship between stimulus exposure and learning speed, and a modulation of reaction time by future learning prospects. We find that rats' strategies approximately maximize total reward over the full learning epoch, suggesting cognitive control over the learning process.
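The speed-accuracy trade-off at the heart of this abstract can be sketched with a generic drift-diffusion simulation. This is not the authors' learning model, only the standard DDM it extends; all parameter values are illustrative assumptions:

```python
# Minimal drift-diffusion model (DDM) sketch. Evidence x accumulates with
# mean rate `drift` plus Gaussian noise until it reaches +bound (correct
# choice) or -bound (error). The bound height sets the speed-accuracy
# trade-off: higher bounds give slower but more accurate decisions.
import numpy as np

def simulate_ddm(drift=0.5, bound=1.0, dt=0.01, sigma=1.0, n_trials=2000):
    rng = np.random.default_rng(0)
    n_correct, rts = 0, []
    for _ in range(n_trials):
        x, t = 0.0, 0.0
        while abs(x) < bound:
            x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
            t += dt
        n_correct += x >= bound          # hit the positive (correct) bound
        rts.append(t)
    return n_correct / n_trials, float(np.mean(rts))

acc_low, rt_low = simulate_ddm(bound=0.5)    # low bound: fast, less accurate
acc_high, rt_high = simulate_ddm(bound=1.5)  # high bound: slow, more accurate
```

Lowering the bound shortens reaction times at the cost of accuracy; the paper's contribution is to ask when deliberately sitting at a suboptimal point on this curve speeds up learning enough to raise total reward over the whole learning epoch.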

    Clinical identification of bacteria in human chronic wound infections: Culturing vs. 16S ribosomal DNA sequencing

    Background: Chronic wounds affect millions of people and cost billions of dollars in the United States each year. These wounds harbor polymicrobial biofilm communities, which can be difficult to elucidate using culturing methods. Clinical molecular microbiological methods are increasingly being employed to investigate the microbiota of chronic infections, including wounds, as part of standard patient care. However, molecular testing is more sensitive than culturing, so markedly different results are reported to clinicians. This study compares the results of aerobic culturing and molecular testing (culture-free 16S ribosomal DNA sequencing), and it examines the relative abundance score generated by the molecular test and the usefulness of that score in predicting the likelihood that the same organism would be detected by culture. Methods: Parallel samples from 51 chronic wounds were studied using aerobic culturing and 16S DNA sequencing for the identification of bacteria. Results: One hundred forty-five (145) unique genera were identified using molecular methods, and 68 of these genera were aerotolerant. Fourteen (14) unique genera were identified using aerobic culture methods. One-third (31/92) of the cultured organisms constituted < 1% of the relative abundance of the wound microbiota by molecular testing. At the genus level, molecular testing identified 85% (78/92) of the bacteria that were identified by culture. Conversely, culturing detected 15.7% (78/497) of the aerotolerant bacteria and 54.9% of the collective aerotolerant relative abundance of the samples. Aerotolerant bacterial genera (and individual species including Staphylococcus aureus, Pseudomonas aeruginosa, and Enterococcus faecalis) with higher relative abundance scores were more likely to be detected by culture, as demonstrated by regression modeling. Conclusion: Discordance between molecular and culture testing is often observed. However, culture-free 16S ribosomal DNA sequencing and its relative abundance score can provide clinicians with insight into which bacteria are most abundant in a sample and which are most likely to be detected by culture. © 2012 Rhoads et al.; licensee BioMed Central Ltd.
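The regression idea in this abstract, predicting whether an organism is recovered by culture from its molecular relative-abundance score, can be sketched as a simple logistic model. The data below are synthetic (the assumed intercept and slope are illustrative, not the study's estimates), and the fit uses plain gradient ascent rather than the study's statistical software:

```python
# Logistic regression sketch: P(detected by culture) as a function of the
# molecular relative-abundance score. Synthetic data; assumed coefficients.
import numpy as np

rng = np.random.default_rng(1)
n = 500
abundance = rng.uniform(0.0, 1.0, size=n)        # relative abundance, 0-1 scale
true_logit = -2.0 + 8.0 * abundance              # assumed true relationship
detected = rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))

X = np.column_stack([np.ones(n), abundance])     # intercept + abundance
w = np.zeros(2)
for _ in range(3000):                            # gradient ascent on log-likelihood
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.5 * (X.T @ (detected - p)) / n

intercept, slope = w    # positive slope: abundant taxa are cultured more often
```

A positive fitted slope reproduces the abstract's qualitative finding: the higher an organism's relative abundance score, the more likely culture is to recover it.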

    Investigating the Evidence of the Real-Life Impact of Acute Hyperglycaemia

    Poorly controlled diabetes mellitus (DM) is associated with the development of long-term micro- and macro-vascular complications. The predominant focus of anti-diabetic therapy has been on lowering glycosylated haemoglobin levels, with a strong emphasis on fasting plasma glucose (particularly in Type 2 DM). There is considerable evidence indicating that post-meal hyperglycaemic levels are independently associated with higher risks of macro-vascular disease. Although some studies have identified mechanisms that may account for these observations, interventions that specifically targeted postprandial glucose rises showed little or no effect in reducing cardiovascular risk. Clinical experience and some recent studies suggest that acute hyperglycaemia affects cognition and other indicators of performance, equivalent to the impairment seen during hypoglycaemia. In this brief report, we evaluate the published studies and argue that acute hyperglycaemia is worth investigating in relation to its real-life implications. In summary, evidence exists suggesting that acute hyperglycaemia may lead to impaired cognitive performance and productivity, but the relationship between these effects and daily activities remains poorly understood. Further research is required to enhance our understanding of acute hyperglycaemia in daily life. A better appreciation of the clinically relevant effects of acute hyperglycaemia will allow us to determine whether it needs to be addressed by specific treatment.

    A reference relative time-scale as an alternative to chronological age for cohorts with long follow-up

    Background: Epidemiologists have debated the appropriate time-scale for cohort survival studies, chronological age and time-on-study being two such time-scales. Importantly, assessment of risk factors may depend on the choice of time-scale. Recently, chronological or attained age has gained support, but a case can be made for a ‘reference relative time-scale’ as an alternative that circumvents difficulties arising with this and other scales. The reference relative time of an individual participant is the integral of a reference population hazard function between the time of entry and the time of exit of the individual. The objective here is to describe the reference relative time-scale, illustrate its use, compare it with attained age by simulation, and explain its relationship to modern and traditional epidemiologic methods. Results: A comparison was made between two models: a stratified Cox model with age as the time-scale and an un-stratified Cox model using the reference relative time-scale. The illustrative comparison used a UK cohort of cotton workers, with differing ages at entry to the study, accrual over a period of time, and long follow-up. Additionally, exponential and Weibull models were fitted, since analysis on the reference relative time-scale need not be restricted to the Cox model. A simulation study showed that analyses using the reference relative time-scale and using chronological age had very similar power to detect a significant risk factor, and both were equally unbiased. Further, the analysis using the reference relative time-scale supported fully parametric survival modelling and allowed percentile predictions and mortality curves to be constructed. Conclusions: The reference relative time-scale was a viable alternative to chronological age, led to simplification of the modelling process, and possessed the features of a good time-scale as defined in reliability theory. The reference relative time-scale has several interpretations and provides a unifying concept that links contemporary approaches in survival and reliability analysis to the traditional epidemiologic methods of Poisson regression and standardised mortality ratios, a connection the community of practitioners has not previously made.
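The core quantity here, a participant's reference relative time, is the integral of a reference-population hazard between age at entry and age at exit. A minimal sketch, assuming a Gompertz reference hazard purely for illustration (the paper would use an actual external reference population):

```python
# Reference relative time: integrate a reference hazard h(t) from a
# participant's age at entry to their age at exit (trapezoid rule).
import math

def reference_relative_time(age_entry, age_exit, hazard, n=1000):
    """Numerically integrate hazard(t) over [age_entry, age_exit]."""
    step = (age_exit - age_entry) / n
    total = 0.0
    for i in range(n):
        a = age_entry + i * step
        total += 0.5 * (hazard(a) + hazard(a + step)) * step
    return total

# Assumed Gompertz reference hazard h(t) = b * exp(c * t); the parameter
# values below are illustrative, not taken from any real life table.
gompertz = lambda t: 1e-4 * math.exp(0.09 * t)

t1 = reference_relative_time(40, 60, gompertz)   # 20 years of follow-up
t2 = reference_relative_time(60, 80, gompertz)   # same length, older ages
```

The same 20 years of chronological follow-up accrue far more reference relative time at ages 60-80 than at 40-60, which is exactly how this scale absorbs the age-dependence of baseline risk.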

    Unsupervised Bayesian linear unmixing of gene expression microarrays

    Background: This paper introduces a new constrained model and the corresponding algorithm, called unsupervised Bayesian linear unmixing (uBLU), to identify biological signatures from high-dimensional assays like gene expression microarrays. The basis for uBLU is a Bayesian model in which the data samples are represented as an additive mixture of random positive gene signatures, called factors, with random positive mixing coefficients, called factor scores, that specify the relative contribution of each signature to a specific sample. The particularity of the proposed method is that uBLU constrains the factor loadings to be non-negative and the factor scores to be probability distributions over the factors. Furthermore, it also provides estimates of the number of factors. A Gibbs sampling strategy is adopted here to generate random samples according to the posterior distribution of the factors, factor scores, and number of factors. These samples are then used to estimate all the unknown parameters. Results: Firstly, the proposed uBLU method is applied to several simulated datasets with known ground truth and compared with previous factor decomposition methods, such as principal component analysis (PCA), non-negative matrix factorization (NMF), Bayesian factor regression modeling (BFRM), and the gradient-based algorithm for general matrix factorization (GB-GMF). Secondly, we illustrate the application of uBLU on a real time-evolving gene expression dataset from a recent viral challenge study in which individuals were inoculated with influenza A/H3N2/Wisconsin. We show that the uBLU method significantly outperforms the other methods on the simulated and real datasets considered here. Conclusions: The results obtained on synthetic and real data illustrate the accuracy of the proposed uBLU method when compared to other factor decomposition methods from the literature (PCA, NMF, BFRM, and GB-GMF). The uBLU method identifies an inflammatory component closely associated with clinical symptom scores collected during the study. Using a constrained model allows recovery of all the inflammatory genes in a single factor.
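The generative model underlying uBLU can be sketched directly. This is only the forward model under assumed dimensions and priors, not the Gibbs sampler the paper describes:

```python
# Forward model sketch for uBLU-style unmixing: each sample (column of Y)
# is an additive mixture of non-negative gene signatures (columns of M)
# with factor scores (columns of A) constrained to the probability simplex.
# Dimensions, priors, and noise level are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)
n_genes, n_factors, n_samples = 200, 3, 10

M = rng.gamma(shape=2.0, scale=1.0, size=(n_genes, n_factors))  # signatures >= 0
A = rng.dirichlet(alpha=np.ones(n_factors), size=n_samples).T   # simplex scores
noise = rng.normal(scale=0.05, size=(n_genes, n_samples))
Y = M @ A + noise                                               # observed data

col_sums = A.sum(axis=0)   # each sample's factor scores sum to 1
```

The two constraints the abstract highlights are visible here: the signature matrix M is non-negative, and each column of A lies on the probability simplex, so factor scores are directly interpretable as proportions of each signature in a sample.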

    Kondo effect in an integer-spin quantum dot

    The Kondo effect is a key many-body phenomenon in condensed matter physics. It concerns the interaction between a localised spin and free electrons. Discovered in metals containing small amounts of magnetic impurities, it is now a fundamental mechanism in a wide class of correlated electron systems. Control over single, localised spins has become relevant also in fabricated structures due to the rapid developments in nano-electronics. Experiments have already demonstrated artificial realisations of isolated magnetic impurities at metallic surfaces, nanometer-scale magnets, controlled transitions between two-electron singlet and triplet states, and a tunable Kondo effect in semiconductor quantum dots. Here, we report an unexpected Kondo effect realised in a few-electron quantum dot containing singlet and triplet spin states whose energy difference can be tuned with a magnetic field. This effect occurs for an even number of electrons at the degeneracy between singlet and triplet states. The characteristic energy scale is found to be much larger than for the ordinary spin-1/2 case.

    CAP interacts with cytoskeletal proteins and regulates adhesion‐mediated ERK activation and motility

    Peer Reviewed
    http://deepblue.lib.umich.edu/bitstream/2027.42/102140/1/emboj7601406-sup-0001.pdf
    http://deepblue.lib.umich.edu/bitstream/2027.42/102140/2/emboj7601406.pd

    DTM ASSESSMENT IN SLOPE INSTABILITY MODELING

    Some methods for predicting landslides in the landscape are based mainly on topography, which can be generated with different forms and tools. This paper therefore aimed to assess the efficiency of two Digital Terrain Models (DTMs), one based on LiDAR points and one on traditional contour lines, for mapping areas susceptible to shallow landslides. To evaluate the DTMs, we used the physically based model SHALSTAB. The tests were carried out in a watershed affected by shallow landslides triggered by intense rainfall in March 2011, in the urban area of the municipality of Antonina (Paraná State), in the southern part of the Serra do Mar mountain range. The physical soil property data required by the model were obtained from inside one of the 2011 landslide scars, complemented by values from the literature. To evaluate the susceptibility maps, we compared the spatial patterns of the instability classes predicted by SHALSTAB with the map of landslide scars. Among the results, one of the validation indexes showed better performance for the LiDAR-derived DTM, whereas the second index showed only a small difference between the DTMs, and both demonstrated a similar distribution of class frequencies.
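For orientation, SHALSTAB couples an infinite-slope stability criterion with steady-state hillslope hydrology; in its standard cohesionless form (Montgomery and Dietrich's formulation, stated here from the general literature rather than from this paper), a grid cell is at the stability threshold when

```latex
\frac{q}{T} \;=\; \frac{b\,\sin\theta}{a}\,
\frac{\rho_{s}}{\rho_{w}}
\left(1 - \frac{\tan\theta}{\tan\phi}\right)
```

where q is the steady-state recharge, T the soil transmissivity, a the upslope drainage area per unit contour width b, θ the local slope angle, φ the soil friction angle, and ρs, ρw the soil and water densities. Since the DTM supplies both a/b and θ, DTM quality directly controls the quality of the resulting susceptibility map, which is what the comparison in this paper tests.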